Illuminating the Path
The Research and Development Agenda for Visual Analytics

Executive Summary: Chapter 6

Edited by Ben.

Get the knowledge flowing and circulating! :)


Moving Research into Practice (Chapter 6)

To truly leverage the successful research results described by this agenda, these results must be moved into practice. They must be deployed and used to address the national security and analysis needs of the country.

The issues associated with moving research into practice are often omitted from R&D agendas of this type. However, this panel felt compelled to provide a framework for four fundamental issues associated with accelerating the process of getting technology into the hands of users. Each of these issues has the potential to make or break the successful deployment of the new technologies we are recommending.

First and foremost, the resulting tools, algorithms, and approaches must be evaluated to ensure that they represent a significant advance over current practice and to ensure that they operate correctly. Second, issues of security and privacy must be addressed from the start and throughout the research, development, and deployment process. Third, software interoperability, architecture, and data handling must be attended to in order to facilitate collaborative research, software evaluation, and software deployment into a wide variety of software environments. Finally, a concerted and sustained effort to insert the resulting technology into operational environments will be essential if the research results are to be of benefit.

The panel recommends several actions to accelerate the transition of research into practice.

Recommendation

Develop an infrastructure to facilitate evaluation of new visual analytics technologies.

All too often we develop and deploy technology that has not been evaluated within the contexts of its intended use. This is especially true when dealing with the bridge between unclassified and classified applications. We need common methods and measures for evaluation, with a focus not only on performance but also on utility.

Evaluation is an iterative process that will require a support infrastructure in order to succeed. It begins with evaluations of research done by the inventors themselves. Good sources of unclassified test data will be required to support this evaluation. The most promising research will mature through further stages of development and refinement and will be combined with other technologies, with progressively more sophisticated evaluations conducted in unclassified visual analytics test beds that will be established to approximate the target deployment environment. Conducting these evaluations will require a test bed infrastructure with more representative, but still unclassified, test data streams to use for evaluation. Ultimately, tools will be evaluated in technology insertion facilities that directly replicate the target production environments, which will require close collaboration among government and research communities. The lessons learned throughout the evaluation process should be captured and shared throughout the community.
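To make the idea of common methods and measures more concrete, the short Python sketch below records both a performance measure (time to complete a task) and a utility measure (an analyst's rating) for each tool run against an unclassified test data set, and then aggregates them so that different tools can be compared on the same scale. The record fields, rating scale, and sample values are illustrative assumptions, not part of the agenda.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class EvaluationRecord:
    """One tool run against an unclassified test data set (hypothetical schema)."""
    tool_name: str
    task_id: str
    seconds_to_complete: float  # performance: how long the analysis task took
    utility_rating: int         # utility: analyst rating, e.g., 1 (not useful) to 5 (essential)

def summarize(records):
    """Aggregate the common measures so different tools can be compared on the same scale."""
    return {
        "runs": len(records),
        "mean_seconds": mean(r.seconds_to_complete for r in records),
        "mean_utility": mean(r.utility_rating for r in records),
    }

if __name__ == "__main__":
    # Illustrative placeholder records, not real evaluation results.
    records = [
        EvaluationRecord("prototype_a", "task_01", 412.0, 4),
        EvaluationRecord("prototype_a", "task_02", 655.5, 3),
    ]
    print(summarize(records))
```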

Recommendation

Create and use a common security and privacy infrastructure, with support for incorporating privacy-supporting technologies, such as data minimization and data anonymization.

Protecting confidentiality and data integrity to ensure privacy and security is a key objective of DHS. As stated in its Strategic Plan [DHS, 2004], “We will ensure the technologies employed sustain, and do not erode, privacy protections relating to the collection, use, and disclosure of personal information. We will eliminate inappropriate access to confidential data to preserve the privacy of Americans. We will maintain an appropriate balance between freedom and safety consistent with the values of our society.”

The goal of visual analytics R&D is to create fundamentally new ways for people to understand and act upon the data available to them. However, this must be done within a framework that fully considers and supports the need for privacy in all phases of the work, from the earliest research stages to the deployment phase.

To make attention to privacy a natural and routine part of the visual analytics R&D process, we need to adopt a standard suite of anonymization technologies and make these available to the visual analytics research community. We further recommend that all researchers in visual analytics receive training so that they clearly understand privacy and security laws and policies and do not inadvertently invent technologies or use data that violate these laws and policies.
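As an illustration of the kind of standard anonymization suite the recommendation calls for, the hypothetical Python sketch below applies the two techniques it names: it drops fields an analysis does not need (data minimization) and replaces identifiers with keyed pseudonyms (a simple form of anonymization). The field names and the keyed-hash approach are assumptions chosen for the example; an operational suite would be selected and vetted against the applicable privacy laws and policies.

```python
import hashlib
import hmac

# Fields the analysis actually needs; everything else is dropped (data minimization).
# These field names are illustrative assumptions, not a standard.
REQUIRED_FIELDS = {"record_id", "event_time", "location", "category"}

def minimize(record: dict) -> dict:
    """Keep only the fields required for the analysis task."""
    return {key: value for key, value in record.items() if key in REQUIRED_FIELDS}

def pseudonymize(value: str, key: bytes) -> str:
    """Replace an identifier with a keyed pseudonym so records can still be linked
    without exposing the original value."""
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def anonymize(record: dict, key: bytes) -> dict:
    """Minimize the record, then pseudonymize its identifier."""
    reduced = minimize(record)
    if "record_id" in reduced:
        reduced["record_id"] = pseudonymize(str(reduced["record_id"]), key)
    return reduced
```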

Recommendation

Use a common component-based software development approach for visual analytics software to facilitate evaluation of research results in integrated prototypes and deployment of promising components in diverse operational environments.

Software interoperability is important to the visual analytics R&D effort. Initially, complementary technologies created by different research teams will be evaluated together in test beds to determine how best to deploy them. Ultimately, though, the most promising breakthrough technologies are likely to have broad applicability and thus will be candidates for deployment into diverse analyst-focused systems in use within DHS and other government agencies. The only effective path to rapid and cost-effective deployment of new technologies is to develop them in the form of reusable software components.
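As one way to picture what reusable components might look like in practice, the hypothetical Python sketch below defines a minimal contract that a test bed or an operational host could program against, along with a small pipeline that chains such components together. The interface and class names are assumptions for illustration; the agenda does not prescribe a specific API.

```python
from abc import ABC, abstractmethod
from typing import Iterable

class AnalyticComponent(ABC):
    """A common contract that a test bed or production host can program against.
    Concrete components (entity extraction, clustering, filtering, a visualization
    view, ...) implement this interface and can be recombined without changes to
    the host."""

    name: str = "unnamed component"

    @abstractmethod
    def accepts(self, data_type: str) -> bool:
        """Report whether the component can consume the given type of data."""

    @abstractmethod
    def process(self, records: Iterable[dict]) -> Iterable[dict]:
        """Transform or enrich the records and pass them on to the next component."""

class Pipeline:
    """A minimal host that chains components, as an evaluation test bed might."""

    def __init__(self, components):
        self.components = components

    def run(self, records: Iterable[dict], data_type: str) -> Iterable[dict]:
        for component in self.components:
            if component.accepts(data_type):
                records = component.process(records)
        return records
```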

Recommendation

Identify and publicize best practices for inserting visual analytics technologies into operational environments.

One measure of success for this R&D agenda is the extent to which the resulting research matures into software that finds broad usage. The process of transitioning software into wide analytical use is complex, and it requires the cooperative efforts of researchers, software engineers, systems infrastructure and operations staff, training and support staff, and the users themselves. Although the process can be difficult, there are examples of successful transitions that provide important lessons and guideposts for future technology insertion efforts. By identifying and publicizing these best practices, we can help speed the transition of the next generation of innovative research into users' hands.

 


 
